Directional Metropolis–Hastings algorithms on hyperplanes
Authors
Abstract
In this paper we define and study new directional Metropolis–Hastings algorithms that propose states in hyperplanes. Each iteration of a directional Metropolis–Hastings algorithm consists of three steps. First, a direction is sampled via an auxiliary variable. Then a potential new state is proposed in the subspace defined by this direction and the current state. Lastly, the potential new state is accepted or rejected according to the Metropolis–Hastings acceptance probability. Traditional directional Metropolis–Hastings algorithms define the direction by a single vector, so the subspace in which the potential new state is sampled is a line. In this paper we let the direction be defined by two or more vectors, so that the corresponding subspace becomes a hyperplane. We compare the performance of directional Metropolis–Hastings algorithms defined on hyperplanes with other frequently used Metropolis–Hastings schemes. Our experience is that the hyperplane algorithms on average produce larger jumps in the sample space and thereby have better mixing properties per iteration. However, with our implementations the hyperplane algorithms are more computationally intensive per iteration, so the simpler algorithms are in most cases better when run for the same amount of computer time. An interesting area for future research is therefore to find variants of our directional Metropolis–Hastings algorithms that require less computation time per iteration.
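The code below is a minimal sketch of these three steps, assuming a Gaussian, state-independent distribution for the direction vectors and a symmetric Gaussian proposal for the coefficients within the hyperplane; under these assumptions the Metropolis–Hastings acceptance probability reduces to the ratio of target densities, whereas the paper's general acceptance probability may contain additional terms. The function name hyperplane_mh and the Gaussian toy target are illustrative assumptions, not the authors' implementation.

import numpy as np

def hyperplane_mh(log_target, x0, n_iter, k=2, step=1.0, rng=None):
    """Sketch of a directional Metropolis-Hastings sampler that proposes
    states in a k-dimensional hyperplane through the current state."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    d = x.size
    samples = np.empty((n_iter, d))
    log_px = log_target(x)
    for i in range(n_iter):
        # Step 1: sample k direction vectors (the auxiliary variable) and
        # orthonormalise them; their span and x define the hyperplane.
        D, _ = np.linalg.qr(rng.standard_normal((d, k)))
        # Step 2: propose a potential new state in x + span(D) using a
        # symmetric Gaussian proposal on the k coefficients.
        y = x + D @ (step * rng.standard_normal(k))
        log_py = log_target(y)
        # Step 3: accept or reject; with a state-independent direction and a
        # symmetric coefficient proposal this is the plain Metropolis ratio.
        if np.log(rng.uniform()) < log_py - log_px:
            x, log_px = y, log_py
        samples[i] = x
    return samples

# Illustrative usage: a 5-dimensional standard Gaussian target.
log_target = lambda x: -0.5 * np.dot(x, x)
chain = hyperplane_mh(log_target, x0=np.zeros(5), n_iter=10_000, k=2)

With k = 1 this sketch reduces to a traditional line-based directional update; k >= 2 gives hyperplane proposals of the kind studied in the paper.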
Similar papers
On directional Metropolis-Hastings algorithms
Metropolis–Hastings algorithms are used to simulate Markov chains with limiting distribution equal to a specified target distribution. The current paper studies target densities on R^n. In directional Metropolis–Hastings algorithms each iteration consists of three steps: i) generate a line by sampling an auxiliary variable, ii) propose a new state along the line, and iii) accept/reject according t...
Full text
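For concreteness, a single line-based update of the kind outlined in steps i)–iii) might look as follows. This is a hedged sketch under the same assumptions as the hyperplane example above (state-independent direction, symmetric proposal along the line), not the authors' code; the function name line_mh_step is an illustrative choice.

import numpy as np

def line_mh_step(log_target, x, log_px, step=1.0, rng=None):
    """One line-based directional Metropolis-Hastings update (sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    # i) generate a line through x by sampling a random unit direction.
    u = rng.standard_normal(x.size)
    u /= np.linalg.norm(u)
    # ii) propose a new state along that line.
    y = x + step * rng.standard_normal() * u
    log_py = log_target(y)
    # iii) accept/reject; the symmetry assumptions reduce the
    # Metropolis-Hastings probability to the target ratio.
    if np.log(rng.uniform()) < log_py - log_px:
        return y, log_py
    return x, log_px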
Directional Metropolis–Hastings Updates for Posteriors with Nonlinear Likelihoods (Norges Teknisk-naturvitenskapelige Universitet)
In this paper we consider spatial problems modeled by a Gaussian random field prior density and a nonlinear likelihood function linking the hidden variables to the observed data. We define a directional block Metropolis–Hastings algorithm to explore the posterior density. The method is applied to seismic data from the North Sea. Based on our results we believe it is important to assess the actu...
Full text
Optimal Proposal Distributions and Adaptive MCMC
We review recent work concerning optimal proposal scalings for Metropolis-Hastings MCMC algorithms, and adaptive MCMC algorithms for trying to improve the algorithm on the fly.
Full text
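As a rough illustration of the adaptive idea reviewed there, the sketch below tunes the scale of a random-walk Metropolis proposal toward a fixed acceptance rate (about 0.234 is a common rule of thumb for high-dimensional random-walk proposals). The function name and the simple diminishing-adaptation rule are illustrative assumptions, not the specific schemes covered by the review.

import numpy as np

def adaptive_rw_metropolis(log_target, x0, n_iter, target_accept=0.234, rng=None):
    """Random-walk Metropolis with a simple diminishing adaptation of the
    proposal scale toward a chosen acceptance rate (sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    log_scale = 0.0
    log_px = log_target(x)
    samples = np.empty((n_iter, x.size))
    for i in range(n_iter):
        y = x + np.exp(log_scale) * rng.standard_normal(x.size)
        log_py = log_target(y)
        accept_prob = np.exp(min(0.0, log_py - log_px))
        if rng.uniform() < accept_prob:
            x, log_px = y, log_py
        # Nudge the proposal scale up when accepting too often, down otherwise;
        # the step size decays so that adaptation diminishes over time.
        log_scale += (accept_prob - target_accept) / np.sqrt(i + 1)
        samples[i] = x
    return samples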
Optimal Proposal Distributions and Adaptive MCMC by Jeffrey
We review recent work concerning optimal proposal scalings for Metropolis-Hastings MCMC algorithms, and adaptive MCMC algorithms for trying to improve the algorithm on the fly.
Full text